
Apple’s upcoming AI smart glasses are starting to sound a lot more exciting

Earlier this week, Bloomberg reported that Apple will be “accelerating” the development of three upcoming AI wearables: smart glasses, a pendant, and AirPods with cameras. All three products are meant to integrate Siri more deeply into our everyday lives; here, I’ll be focusing on the glasses.

Apple has long had ambitions to make glasses. The idea of Apple building AR glasses has been in the rumor mill for ages. For now, that project is on pause – and Vision Pro will have to do. In the meantime, the company is pursuing AI glasses, similar to Meta Ray-Bans.

Meta Ray-Bans have been a hit since they launched towards the end of 2023. Despite Meta’s reputation for invading users’ privacy, loads of people have been excited to wear a pair of Meta glasses with cameras on them. At their core, they have cameras, microphones, and speakers – allowing users to talk to Meta AI about their surroundings, listen to music, or take photos and videos.


Starting last year, we began to hear that Apple was working towards releasing its own version of Meta Ray-Bans in the coming year. Now, development has reportedly advanced significantly.

Apple is reportedly integrating two camera lenses: one for computer vision, and another for taking photos and videos. The company has also figured out how to embed all of the components in the frame, after initially planning to rely on an external battery.

Apple has a new trick up its sleeve

With products that rely on voice for communication, there’s an obvious problem: it isn’t always practical to speak out loud. This is largely why I don’t use the voice assistant features on my Meta Ray-Bans.

Recently, though, Apple acquired a startup, Q.ai, for $2 billion.

While little is known about the company, one thing is clear: it specialized in machine learning systems for interpreting silent voice input.

Right now, if you want to speak to a voice assistant, you have to be pretty audible. Even whispers can trip up certain voice models at times, especially when you aren’t in a completely silent environment. This new acquisition could solve that.

Q.ai also researched systems for interpreting micro facial movements, enabling them to understand speech without it being audible at all.

If this can all come together nicely, I think Apple Glasses will be incredibly appealing to loads of people, and might make people take voice assistants more seriously.

Wrap up


Ultimately, I’m sure Meta Ray-Bans will end up being much cheaper than Apple’s AI glasses. However, if Apple truly sticks the landing with next-level speech recognition technology, I think a lot of people will be willing to overlook the price difference. While we don’t have a concrete release date, it seems likely that the glasses will arrive within the next year or so.

What do you think of Apple’s AI glasses? Are you excited for them, or will you be skipping? Let us know in the comments.


Follow Michael: X/Twitter, Bluesky, Instagram


You’re reading 9to5Mac — experts who break news about Apple and its surrounding ecosystem, day after day.